Tunable Sparse Network Coding
Authors
Abstract
A fundamental understanding of the relationship between delay performance and complexity in network coding is instrumental towards its application in practical systems. The main argument against delay-optimal random linear network coding (RLNC) is its decoding complexity, which is O(n³) for n original packets. Fountain codes, such as LT and Raptor codes, reduce the decoding load on the receiver, but at the cost of introducing additional, non-negligible delay. This increased delay stems from the inherent sparsity of the code, which significantly reduces the impact of a new coded packet, i.e., the probability that it is linearly independent of the knowledge already at the receiver, as the transmission approaches its end (when few degrees of freedom are missing). Thus, the additional overhead is mainly due to the transmission of the last data packets. Our key observation is that switching gears to denser codes as the transmission process progresses can considerably reduce the delay overhead while maintaining the complexity advantages of a sparse code. We propose tunable sparse network coding, a dynamic coding mechanism whose code structure has two regions: a sparse region, with various levels of sparsity, and a dense region, where packets are generated as in RLNC. We characterize the problem for multicast sessions on general networks and illustrate the trade-offs in special cases of interest. We also present a novel mechanism that performs efficient Gaussian elimination on sparse matrices and guarantees linear decoding complexity with high probability.
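To make the switching idea concrete, the following is a minimal Python sketch (not the authors' implementation): coded packets are drawn over GF(2) with a low density in the sparse region and with density 1/2 (RLNC-like) once only a few degrees of freedom are missing, and the receiver tracks innovative packets with incremental Gaussian elimination. The field choice, the density schedule, and the parameters `sparse_density` and `switch_gap` are illustrative assumptions rather than values from the paper.

```python
# Minimal sketch (illustrative only) of tunable sparse network coding over GF(2).
# A sender generates coded packets whose density is low in the sparse region and
# switches to 1/2 (RLNC-like) once few degrees of freedom are missing; the
# receiver counts innovative packets via incremental Gaussian elimination.
import random

def coded_packet(n, density):
    """Draw a length-n GF(2) coding vector; each source packet is included
    independently with probability `density` (resample if all-zero)."""
    while True:
        vec = [1 if random.random() < density else 0 for _ in range(n)]
        if any(vec):
            return vec

def is_innovative(vec, basis):
    """Reduce vec against the stored rows (pivot -> row) over GF(2); store it
    and return True if a new pivot survives, i.e. the packet is innovative."""
    v = list(vec)
    for i in range(len(v)):
        if v[i]:
            if i in basis:
                v = [a ^ b for a, b in zip(v, basis[i])]
            else:
                basis[i] = v
                return True
    return False

def transmissions_needed(n=64, sparse_density=0.05, switch_gap=8, seed=1):
    """Count packets sent until the receiver holds n degrees of freedom, with a
    sparse-to-dense switch when fewer than `switch_gap` dofs are missing."""
    random.seed(seed)
    basis, rank, sent = {}, 0, 0
    while rank < n:
        density = sparse_density if n - rank > switch_gap else 0.5
        sent += 1
        if is_innovative(coded_packet(n, density), basis):
            rank += 1
    return sent

if __name__ == "__main__":
    print("packets sent with sparse-to-dense switch:", transmissions_needed())
    print("packets sent with purely sparse code:   ",
          transmissions_needed(switch_gap=0))
```

Comparing the two runs illustrates the trade-off described above: the purely sparse schedule spends many extra transmissions on the last few degrees of freedom, while the switched schedule recovers most of the sparse code's complexity savings without that delay penalty.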
Similar Articles
Rice Classification and Quality Detection Based on Sparse Coding Technique
Classification of various rice types and determination of their quality is a major issue in the scientific and commercial fields associated with modern agriculture. In recent years, various image processing techniques have been used to identify different types of agricultural products. Various color- and texture-based features have also been employed to achieve the desired results in this area. In this ...
Face Recognition using an Affine Sparse Coding approach
Sparse coding is an unsupervised method which learns a set of over-complete bases to represent data such as images and video. Sparse coding has attracted increasing interest for image classification applications in recent years. But in cases where there are similar images from different classes, such as face recognition applications, different images may be classified into the same class, and hen...
Traffic Scene Analysis using Hierarchical Sparse Topical Coding
Motion patterns in traffic videos can be exploited directly to generate high-level descriptions of the video content. Such descriptions may further be employed in different traffic applications, such as traffic phase detection and abnormal event detection. One of the most recent and successful unsupervised methods for complex traffic scene analysis is based on topic models. In this pa...
Supervised Deep Sparse Coding Networks
In this paper, we propose a novel multilayer sparse coding network capable of efficiently adapting its own regularization parameters to a given dataset. The network is trained end-to-end with a supervised task-driven learning algorithm via error backpropagation. During training, the network learns both the dictionaries and the regularization parameters of each sparse coding layer so that the re...
Support Regularized Sparse Coding and Its Fast Encoder
Sparse coding represents a signal by a linear combination of only a few atoms of a learned over-complete dictionary. While sparse coding exhibits compelling performance for various machine learning tasks, the process of obtaining the sparse code with a fixed dictionary is carried out independently for each data point, without considering the geometric information and manifold structure of the entire data. We propos...